I hallucinate. These forced perceptions come to me only at night and are, at times, terrifying. Sometimes, right as I fall asleep, I open my eyes and see large objects that look like multilegged mutant starfish. They dangle in the air above my head, shimmering and swirling, their bodies crazed like pottery. They appear very real to me, and it seems as if I could reach out and touch their crackled surfaces. They provoke a mix of emotions: fear, fascination, repulsion, and wonder.
I feel the same way about the billboard that was placed a few weeks ago in the parking lot near my office. It’s an advertisement for exa.ai, featuring a magnificent white Pegasus unicorn flying over cotton candy pink clouds. There’s a clear blue sky in the background, and a luminous arcing rainbow. To me, the image looks like a less saturated and more sophisticated version of Lisa Frank’s artwork. It’s whimsical, sparkly, maximalist, and friendly, with its pastel hues and fluffy cumuli. The message on the billboard states “Hallucinations are fun…but not from your AI”.
I am an early adopter of AI consumer platforms including ChatGPT, Claude, Perplexity, and Gemini. I find them helpful for mundane tasks such as writing a Google review or editing my essays. It’s been a pleasure to introduce my friends and family to AI, helping them streamline communication and enhance organization, and it makes me feel a bit like a magician when I employ AI to create an image that’s unlikely to be found on the web. For example, I have an inside joke about pterodactyls with someone who loves her Pure Barre workouts. For the fun of it, I asked ChatGPT to generate an image of a pterodactyl wearing a sweatband and leg warmers, perspiring as she leaves her exercise class. This is what I got after a few prompt iterations and refinements:

But image production isn’t always so smooth with AI technologies; we’ve all seen AI-created videos and stills showing people with three arms or other physical anomalies. In text generation, AI sometimes develops responses that contain false or misleading information presented as fact. These are examples of “hallucinations”, incorrect or misleading results generated by AI.
I get the gist of the exa.ai billboard and the concept behind its winged unicorn, which contrasts the playful absurdity of human imagination with the undesirable hallucinations produced by unreliable AI systems. But the written copy vexes me: “Hallucinations are fun…but not from your AI”. This rhetoric decontextualizes hallucination, employing the term as a form of harmless wordplay.
I am concerned that this type of messaging, the framing of hallucinations as some sort of delightfully altered perceptual artifact, is tone-deaf toward the global mental health crisis. It presents hallucination as a glittery aesthetic rather than a destabilizing neurological phenomenon. Perhaps this wouldn’t matter to me…perhaps I could brush it off as awkward copywriting…if the ad didn’t sit just a few feet above the parking lot I walk past every day, where it greets me like an obnoxiously cheerful neighbor who insists on joking about a subject he doesn’t understand.
Hallucinations are not, for many of us, benign imaginative indulgences. They are not pretty pastel cloudscapes populated by magical flying narwhal horses who swoop in on rainbows. True human hallucinations arrive unbidden. They feel real because they are generated by the same perceptual machinery that helps us navigate the world safely. But, in these moments, that machinery has slipped a gear. The outcomes of this errant cognition can be alarming.
The underlying pathophysiologies driving hallucinations are numerous. Some people hallucinate under the influence of medications; others do so in the later stages of neurodegenerative illnesses such as Alzheimer’s. Individuals with schizophrenia often hallucinate menacing shadow figures or insectoid creatures. My own hypnagogia is startling, and only once in my lifetime lighthearted. My father hallucinated in the last days of his life, describing for me the military aircraft flying through his hospital room. Dad was tied to his bed lest he leap up to engage with his delusions. It was devastating to witness his cognitive descent into delirium.
It’s strange, then, to see a word that has shaped some of my sensory and emotional life rendered as a punchline. Stranger still to see it printed several feet tall, stretched across a billboard, turned into a tagline about software reliability. I understand the intent, that AI shouldn’t hallucinate, and I recognize that exa.ai wants to emphasize the stability of their platform. But the billboard’s joke relies on a gulf between two meanings of the same word, one that might create moments of absurdity, frustration, or humor, and the one that can devastate.
I suppose my reaction comes from the liminal zone between these definitions. Between the shimmering sea creatures that hover above my bed and the memories of my father’s last weeks, when he gazed toward objects only he could see, his eyes tracking aircraft in a sky concealed from me. His hallucinations were decidedly not fun. They were, in those final days, a kind of unraveling.
I don’t expect exa.ai to know this about me, or about my father. I don’t assume the billboard was designed with malice or indifference. But the gleeful certainty of the message…that hallucinations are fun…feels like a billboard written in a parallel world, one where human perception is tidy, harmless, and easy to joke about, and where mental health issues are as insignificant as a splinter in one’s finger.
Most people, of course, don’t hallucinate. They don’t witness hypnagogia in their bedrooms. They don’t see bug people or nebulous forms crouched in the periphery of their vision. And they don’t watch KC-135 aircraft looping parabolic arcs above their beds. For many folks, hallucination seems like a metaphorical concept, something adjacent to daydreaming or imagination. But to those of us who inhabit versions of the perceptual spectrum that diverge from the norm, hallucinations carry weight, history, and consequences. They are not merely conceptual errors; they are a lived and often frightening experience.
And maybe that’s the deeper issue here. The advertisement collapses two kinds of hallucinations…the involuntary, neurological kind and the machine-learning kind…into one unified idea. In doing so, it momentarily conflates human vulnerability with technological failure, while also anthropomorphizing computers. But AI hallucinations aren’t hallucinations at all; they are computational mistakes, artifacts of errant predictions. When I hallucinate, it is not a glitch in a training dataset. It is a moment where my brain generates a world-within-the-world, vivid and overwhelming, unbidden and unpredictable.
I don’t think exa.ai intended any cruelty in their marketing strategy. However, intention isn’t the only measure of impact. The billboard reminds me how easily a brand strategy can take a clinical, psychological, or neurological phenomenon and morph it into a novelty, a quip, a visual pun. It makes me think about all the other terms borrowed from psychiatry and neurology: narcissist, OCD, bipolar, triggered, gaslit, crazy, words that are casually tossed into conversation without their history, their weight, and their meaning.
I believe my reaction to the unicorn billboard is shaped by the fact that much of my life is spent in sensory states that exist outside the standard-issue human experience. And maybe this is why the billboard feels, at once, ridiculous and strangely intimate. It picks at something raw in me, neurodivergences I’ve spent years learning to articulate, to manage, and to integrate.
I don’t want exa.ai to take down the billboard. I’m not calling for censorship or apology tours or public shaming or brand cancelling. What I want is for us…the big “US”… to collectively consider the metaphors we borrow from the mind. We need to ask ourselves some provocative questions: what do we flatten when we turn neurological concepts into slogans? What is the cost of condensing neurocognitive differences into quirky tropes? Who benefits from this culturally inept reframing? And who might be standing under the billboard, looking up, feeling that familiar cocktail of fear, fascination, repulsion, and wonder?